
    Storage Capacity of Extremely Diluted Hopfield Model

    The storage capacity of the extremely diluted Hopfield model is studied using Monte Carlo techniques. In this work, instead of diluting the synapses according to a given distribution, the dilution is obtained systematically by retaining only the synapses with dominant contributions. With this prescribed dilution method, the critical storage capacity of the system increases with a decreasing number of synapses per neuron, almost reaching the value obtained from mean-field calculations. It is also shown that the increase in the storage capacity of the diluted system depends on the storage capacity of the fully connected Hopfield model and on the fraction of diluted synapses. Comment: LaTeX, 14 pages, 4 eps figures
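    A minimal sketch of the prescribed dilution step, assuming a standard Hebbian coupling matrix and illustrative values for the network size N, the number of patterns P, and the number K of synapses retained per neuron:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, K = 200, 10, 20  # neurons, stored patterns, synapses kept per neuron (assumed values)

# Hebbian couplings from random binary patterns
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

# systematic dilution: per neuron, retain only the K synapses
# with the dominant (largest-magnitude) contributions
J_diluted = np.zeros_like(J)
for i in range(N):
    keep = np.argsort(np.abs(J[i]))[-K:]
    J_diluted[i, keep] = J[i, keep]
```

    The dilution is deterministic rather than drawn from a distribution: weak couplings are simply discarded row by row.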

    Spontaneous structure formation in a network of chaotic units with variable connection strengths

    As a model of temporally evolving networks, we consider a globally coupled logistic map with variable connection weights. The model exhibits self-organization of network structure, reflected in the collective behavior of the units. Structural order emerges even without any inter-unit synchronization of dynamics. Within this structure, the units spontaneously separate into two groups, distinguished by the fact that the first group possesses many outwardly directed connections to the second group, while the second group possesses only a few outwardly directed connections to the first. The relevance of these results to structure formation in neural networks is briefly discussed. Comment: 4 pages, 3 figures, REVTeX
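    A minimal sketch of such a model, under assumed parameter values and a simple similarity-driven weight update (the specific plasticity rule here is illustrative, not the one from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N, c, a, delta, T = 50, 0.2, 3.97, 0.1, 500  # units, coupling, map parameter, plasticity rate, steps

def f(x):
    return a * x * (1.0 - x)  # logistic map

x = rng.random(N)
w = rng.random((N, N))
np.fill_diagonal(w, 0.0)
w /= w.sum(axis=1, keepdims=True)  # incoming weights of each unit sum to one

for _ in range(T):
    fx = f(x)
    x = (1.0 - c) * fx + c * (w @ fx)  # globally coupled map with weighted mean field
    # strengthen connections between units in similar states, then renormalize
    w *= 1.0 + delta * (1.0 - np.abs(fx[:, None] - fx[None, :]))
    np.fill_diagonal(w, 0.0)
    w /= w.sum(axis=1, keepdims=True)
```

    After many iterations, the asymmetry of w (out-weights vs. in-weights per unit) can be inspected to look for the two-group structure described above.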

    Bump formation in a binary attractor neural network

    This paper investigates the conditions for the formation of local bumps in the activity of binary attractor neural networks with spatially dependent connectivity. We show that these formations are observed when an asymmetry between the activity during retrieval and the activity during learning is imposed. An analytical approximation for the order parameters is derived. The corresponding phase diagram shows a relatively large and stable region where this effect is observed, although the critical storage and information capacities decrease drastically inside that region. We demonstrate that the stability of the network when starting from the bump formation is greater than its stability when starting even from the whole pattern. Finally, we show very good agreement between the analytical results and simulations performed for different network topologies. Comment: about 14 pages
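    The ingredients can be sketched as follows, with assumed sizes, a Gaussian distance kernel standing in for the spatially dependent connectivity, and retrieval forced to a lower activity level than learning (the imposed asymmetry):

```python
import numpy as np

rng = np.random.default_rng(5)
N, P, a, sigma = 400, 5, 0.1, 40.0  # neurons, patterns, coding level, kernel width (assumed)

pos = np.arange(N)
gap = np.abs(pos[:, None] - pos[None, :])
d = np.minimum(gap, N - gap)                    # distances on a ring
kernel = np.exp(-(d / sigma) ** 2)              # spatially dependent connectivity

eta = (rng.random((P, N)) < a).astype(float)    # sparse binary patterns
J = kernel * ((eta - a).T @ (eta - a)) / (a * (1 - a) * N)
np.fill_diagonal(J, 0.0)

# retrieval at half the learning activity level: keep only the most driven neurons
m = int(0.5 * a * N)
s = eta[0].copy()
for _ in range(50):
    h = J @ s
    s = np.zeros(N)
    s[np.argsort(h)[-m:]] = 1.0
```

    With the retrieval activity constrained below the learning activity, the surviving active neurons tend to concentrate in a spatially contiguous bump rather than covering the whole stored pattern.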

    Apollo 7 retrofire and reentry of service propulsion module. Further study of Intelsat 2 F-2 apogee burn

    Photography of the Apollo 7 retrofire and service propulsion module reentry, and of the apogee burn of the Intelsat 2 F-2 satellite.

    Memory Aware Synapses: Learning what (not) to forget

    Humans can learn in a continuous manner. Old, rarely utilized knowledge can be overwritten by new incoming information, while important, frequently used knowledge is prevented from being erased. In artificial learning systems, lifelong learning has so far focused mainly on accumulating knowledge over tasks and overcoming catastrophic forgetting. In this paper, we argue that, given the limited model capacity and the unlimited new information to be learned, knowledge has to be preserved or erased selectively. Inspired by neuroplasticity, we propose a novel approach for lifelong learning, coined Memory Aware Synapses (MAS). It computes the importance of the parameters of a neural network in an unsupervised and online manner. Given a new sample fed to the network, MAS accumulates an importance measure for each parameter of the network based on how sensitive the predicted output function is to a change in that parameter. When learning a new task, changes to important parameters can then be penalized, effectively preventing important knowledge related to previous tasks from being overwritten. Further, we show an interesting connection between a local version of our method and Hebb's rule, which is a model for the learning process in the brain. We test our method on a sequence of object recognition tasks and on the challenging problem of learning an embedding for predicting triplets. We show state-of-the-art performance and, for the first time, the ability to adapt the importance of the parameters based on unlabeled data towards what the network needs (not) to forget, which may vary depending on test conditions. Comment: ECCV 201
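    The core importance estimate can be sketched for a toy linear model (assumed dimensions; a real MAS implementation would backpropagate the squared output norm through a deep network):

```python
import numpy as np

rng = np.random.default_rng(2)
d_in, d_out, n_samples = 8, 4, 100   # toy dimensions (assumed)

W = rng.normal(size=(d_out, d_in))   # parameters of a linear "network"
Omega = np.zeros_like(W)

for _ in range(n_samples):
    x = rng.normal(size=d_in)
    y = W @ x
    # gradient of the squared L2 norm of the output w.r.t. each parameter
    grad = 2.0 * np.outer(y, x)
    Omega += np.abs(grad)
Omega /= n_samples                   # importance: average output sensitivity per parameter

lam = 1.0                            # regularization strength for the next task (assumed)

def penalty(W_new):
    # penalize changes to important parameters when learning a new task
    return lam * np.sum(Omega * (W_new - W) ** 2)
```

    Because the importance depends only on inputs and the predicted output, it can be accumulated on unlabeled data, which is what allows the method to adapt to test conditions.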

    Reflection Positivity and Monotonicity

    We prove general reflection positivity results for both scalar fields and Dirac fields on a Riemannian manifold, and comment on applications to quantum field theory. As another application, we prove the inequality $C_D \leq C_N$ between Dirichlet and Neumann covariance operators on a manifold with a reflection. Comment: 11 pages

    Rotation of Late-Type Stars in Praesepe with K2

    We have Fourier analyzed 941 K2 light curves of likely members of Praesepe, measuring periods for 86% and increasing the number of known rotation periods (P) by nearly a factor of four. The distribution of P vs. (V-K), a mass proxy, has three different regimes: (V-K)<1.3, where the rotation rate slows rapidly as mass decreases; 1.3<(V-K)<4.5, where the rotation rate slows more gradually as mass decreases; and (V-K)>4.5, where the rotation rate increases rapidly as mass decreases. In this last regime, there is a bimodal distribution of periods, with few between ~2 and ~10 days. We interpret this to mean that once M stars start to slow down, they do so rapidly. The K2 period-color distribution in Praesepe (~790 Myr) is very different from that in the Pleiades (~125 Myr) for late F, G, K, and early-M stars; the overall distribution moves to longer periods and is better described by two line segments. For mid-M stars, the relationship has similarly broad scatter and is steeper in Praesepe. The diversity of light curves and of periodogram types is similar in the two clusters; about a quarter of the periodic stars in both clusters have multiple significant periods. Multi-periodic stars dominate among the higher masses, starting at a bluer color in Praesepe ((V-K)~1.5) than in the Pleiades ((V-K)~2.6). In Praesepe, there are relatively more light curves with two widely separated periods, ΔP > 6 days. Some of these could be examples of M-star binaries in which one star has spun down but the other has not. Comment: Accepted by Ap
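    As a toy illustration of the period-measurement step, here is a synthetic spotted-star light curve analyzed with a plain FFT periodogram (injected period, cadence, and noise level are assumed; the actual analysis of K2 data is considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0.0, 70.0, 0.02)         # ~70-day campaign, regular cadence, in days (assumed)
P_true = 11.3                          # injected rotation period in days (assumed)
flux = 1.0 + 0.01 * np.sin(2 * np.pi * t / P_true) + 0.002 * rng.normal(size=t.size)

# periodogram via FFT of the mean-subtracted light curve
freq = np.fft.rfftfreq(t.size, d=t[1] - t[0])
power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
P_meas = 1.0 / freq[1:][np.argmax(power[1:])]  # skip the zero-frequency bin
```

    The recovered period is quantized to the frequency grid of the campaign length, which is one reason real analyses refine the peak rather than reading it off directly.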

    Effects of Synaptic and Myelin Plasticity on Learning in a Network of Kuramoto Phase Oscillators

    Models of learning typically focus on synaptic plasticity. However, learning is the result of both synaptic and myelin plasticity. Specifically, synaptic changes often co-occur and interact with myelin changes, leading to complex dynamic interactions between these processes. Here, we investigate the implications of these interactions for the coupling behavior of a system of Kuramoto oscillators. To that end, we construct a fully connected, one-dimensional ring network of phase oscillators whose coupling strengths (reflecting synaptic strength) and conduction velocities (reflecting myelination) are each regulated by a Hebbian learning rule. We evaluate the behavior of the system in terms of structural connectivity (pairwise connection strength and conduction velocity) and functional connectivity (local and global synchronization behavior). We find that under conditions in which a system limited to synaptic plasticity develops two distinct clusters, both structurally and functionally, additional adaptive myelination allows for functional communication across these structural clusters. Hence, dynamic conduction velocities permit the functional integration of structurally segregated clusters. Our results confirm that network states following learning may differ when myelin plasticity is considered in addition to synaptic plasticity, pointing towards the relevance of integrating both factors in computational models of learning. Comment: 39 pages, 15 figures. This work is submitted to Chaos: An Interdisciplinary Journal of Nonlinear Science
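    A minimal sketch of such a system, with assumed parameters and a simplified Hebbian rule applied to both coupling strengths and conduction velocities; delays enter as ring distance divided by velocity, handled with a circular phase-history buffer:

```python
import numpy as np

rng = np.random.default_rng(3)
N, dt, steps = 20, 0.01, 1000          # oscillators, time step, iterations (assumed)
eps_k, eps_v = 0.05, 0.05              # plasticity rates for coupling and velocity (assumed)

omega = rng.normal(1.0, 0.1, size=N)   # natural frequencies
k = rng.uniform(0.0, 0.5, size=(N, N)) # coupling strengths ("synapses")
np.fill_diagonal(k, 0.0)
v = np.full((N, N), 1.0)               # conduction velocities ("myelin")

idx = np.arange(N)
gap = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(gap, N - gap) + 1.0  # ring distances (offset avoids zero delay)

max_lag = 2400
theta = rng.uniform(0.0, 2 * np.pi, size=N)
hist = np.tile(theta, (max_lag, 1))    # circular buffer of past phases
head = 0

for _ in range(steps):
    tau = dist / v                                          # delay = distance / velocity
    lag = np.clip((tau / dt).astype(int), 1, max_lag - 1)
    theta_del = hist[(head - lag) % max_lag, idx[None, :]]  # theta_j(t - tau_ij)
    diff = theta_del - theta[:, None]
    theta = theta + dt * (omega + (k * np.sin(diff)).sum(axis=1) / N)
    # Hebbian plasticity: coupling and velocity both grow when phases align
    k = np.clip(k + dt * eps_k * (np.cos(diff) - k), 0.0, 1.0)
    np.fill_diagonal(k, 0.0)
    v = np.clip(v + dt * eps_v * np.cos(diff), 0.5, 2.0)
    head = (head + 1) % max_lag
    hist[head] = theta
```

    Freezing v reduces this to a purely synaptic model; letting v adapt changes the effective phase relationships between clusters, which is the mechanism of interest here.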

    Learning by message-passing in networks of discrete synapses

    We show that a message-passing process makes it possible to store, in binary "material" synapses, a number of random patterns that almost saturates the information-theoretic bounds. We apply the learning algorithm to networks characterized by a wide range of different connection topologies and of a size comparable with that of biological systems (e.g. n ≃ 10^5-10^6). The algorithm can be turned into an on-line, fault-tolerant learning protocol of potential interest in modeling aspects of synaptic plasticity and in building neuromorphic devices. Comment: 4 pages, 3 figures; references updated and minor corrections; accepted in PR
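    The message-passing algorithm itself is involved; as a far simpler point of comparison, the following clipped-perceptron rule also trains binary ±1 synapses (toy sizes assumed), though its storage capacity falls well short of the near-optimal message-passing scheme described above:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 201, 10                         # synapses and patterns (assumed, well below capacity)

xi = rng.choice([-1, 1], size=(p, n))  # random input patterns
tgt = rng.choice([-1, 1], size=p)      # target outputs

h = np.zeros(n, dtype=int)             # hidden integer state behind each binary synapse
for _ in range(500):                   # training sweeps
    w = np.where(h >= 0, 1, -1)        # binary "material" synapses
    mistakes = 0
    for mu in range(p):
        if tgt[mu] * (w @ xi[mu]) <= 0:     # pattern not yet stored
            h += tgt[mu] * xi[mu]           # perceptron-style update on the hidden state
            w = np.where(h >= 0, 1, -1)
            mistakes += 1
    if mistakes == 0:
        break
```

    Only the clipped weights w are ever used for readout, so the stored network is genuinely binary; the integer h plays the role of an auxiliary training variable.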